# High-precision semantic retrieval
## BGE M3 Ko Gguf
License: Apache-2.0 · Author: NeuroWhAI
BGE-m3-ko is a multilingual embedding model optimized for Korean and English, focused on efficient semantic retrieval tasks (see the usage sketch below).
Tags: Text Embedding, Multilingual
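A minimal sketch of loading the GGUF weights for embedding, assuming the llama-cpp-python bindings and a locally downloaded file; the file name used here is hypothetical and should be replaced with the actual quantized file from the repository.

```python
# Hedged sketch: embed Korean/English sentences from a local GGUF file.
# Assumes llama-cpp-python is installed; the model path is hypothetical.
from llama_cpp import Llama

model = Llama(
    model_path="./bge-m3-ko-q8_0.gguf",  # hypothetical local path to the GGUF file
    embedding=True,                      # run the model in embedding mode
)

sentences = ["한국의 수도는 어디인가요?", "What is the capital of Korea?"]
embeddings = [model.embed(s) for s in sentences]  # list of float vectors
print(len(embeddings), len(embeddings[0]))
```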
## Llm2vec Sheared LLaMA Mntp Supervised
License: MIT · Author: McGill-NLP
LLM2Vec-Sheared-LLaMA-supervised is a text embedding model built on the Sheared-LLaMA architecture and fine-tuned with supervision for sentence similarity; it supports text embedding, information retrieval, and text classification (see the sketch below).
Tags: Text Embedding, English
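A sketch of producing sentence embeddings with the llm2vec package, assuming its `LLM2Vec.from_pretrained` / `encode` interface; the repository identifiers and keyword arguments shown are assumptions to verify against the McGill-NLP model card.

```python
# Hedged sketch: sentence similarity with an LLM2Vec model.
# The checkpoint names and arguments below are assumptions; check the model card.
import torch
from llm2vec import LLM2Vec

l2v = LLM2Vec.from_pretrained(
    "McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp",  # assumed base (MNTP) weights
    peft_model_name_or_path="McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp-supervised",
    device_map="auto",
    torch_dtype=torch.bfloat16,
)

emb = l2v.encode(["A cat sits on the mat.", "A kitten is resting on a rug."])  # -> tensor [2, d]
sim = torch.nn.functional.cosine_similarity(emb[0:1], emb[1:2]).item()
print(sim)
```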
## Mmlw E5 Base
License: Apache-2.0 · Author: sdadas
mmlw-e5-base is a sentence-transformers feature extraction model focused on sentence similarity and supporting Polish (see the sketch below).
Tags: Text Embedding, Transformers, Other
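A minimal sentence-transformers sketch, assuming the model is published as `sdadas/mmlw-e5-base` and, as is typical for E5-derived models, expects `query: ` and `passage: ` prefixes; verify both assumptions against the model card.

```python
# Hedged sketch: Polish semantic retrieval with sentence-transformers.
# The repository id and the E5-style "query:"/"passage:" prefixes are assumptions.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sdadas/mmlw-e5-base")

query = "query: Jak dbać o zdrowie?"  # "How to take care of your health?"
passages = [
    "passage: Regularna aktywność fizyczna poprawia zdrowie.",  # about health
    "passage: Warszawa jest stolicą Polski.",                   # unrelated
]

q_emb = model.encode(query, convert_to_tensor=True)
p_emb = model.encode(passages, convert_to_tensor=True)
scores = util.cos_sim(q_emb, p_emb)
print(scores)  # the health-related passage should score higher
```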
## SGPT 5.8B Weightedmean Nli Bitfit
Author: Muennighoff
SGPT-5.8B is a 5.8B-parameter sentence transformer optimized with weighted-mean pooling and NLI (Natural Language Inference) fine-tuning, designed for sentence similarity and feature extraction tasks (see the pooling sketch below).
Tags: Text Embedding
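The "weightedmean" in the name refers to position-weighted mean pooling over token hidden states. The sketch below illustrates that pooling idea in plain PyTorch, using the linearly increasing position weights that SGPT describes; the released model's exact implementation may differ.

```python
# Hedged sketch: position-weighted mean pooling for SGPT-style sentence embeddings.
import torch

def weighted_mean_pooling(hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """hidden_states: [batch, seq, dim]; attention_mask: [batch, seq] of 0/1."""
    # Later tokens get linearly increasing weights (1, 2, ..., seq_len);
    # padded positions are zeroed out by the attention mask.
    seq_len = hidden_states.shape[1]
    positions = torch.arange(1, seq_len + 1, device=hidden_states.device, dtype=hidden_states.dtype)
    weights = positions.unsqueeze(0).unsqueeze(-1) * attention_mask.unsqueeze(-1).to(hidden_states.dtype)
    summed = (hidden_states * weights).sum(dim=1)
    return summed / weights.sum(dim=1).clamp(min=1e-9)

# Toy check with random states: two sentences, sequence length 4, hidden size 8.
h = torch.randn(2, 4, 8)
mask = torch.tensor([[1, 1, 1, 0], [1, 1, 1, 1]])
print(weighted_mean_pooling(h, mask).shape)  # torch.Size([2, 8])
```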